Complete Reference Guide

Connection Strings
From Scratch

Every connection string format across databases, cloud services, message brokers, caches, and protocols, with anatomy breakdowns and copy-ready examples.

35+ Platforms  |  10 Categories  |  100+ Examples
🗄️

Relational SQL Databases

Traditional structured databases with JDBC, ODBC, and driver-specific formats

🧬 Anatomy of a Connection String
Understanding the common parts before diving in
protocol://user:password@host:port/database?options
protocol - driver/engine
user - login name
password - credential
host - server address
port - TCP port
database - schema/db name
options - key=value params
💡 Rule of thumb: Not all parts are always present. SQLite has no host/user. Some drivers use semicolons (;) instead of & for options. Always check your specific driver docs.
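Those parts map directly onto Python's standard URL parser, which makes a handy sanity check when a string misbehaves; a quick sketch with placeholder credentials:

```python
from urllib.parse import urlsplit, parse_qs

# Placeholder connection string following the anatomy above
url = "postgresql://myuser:mypass@db.example.com:5432/mydb?sslmode=require"

parts = urlsplit(url)
protocol = parts.scheme              # "postgresql"
user = parts.username                # "myuser"
password = parts.password            # "mypass"
host = parts.hostname                # "db.example.com"
port = parts.port                    # 5432
database = parts.path.lstrip("/")    # "mydb"
options = parse_qs(parts.query)      # {"sslmode": ["require"]}
```

Note that urlsplit only understands URL-style strings; semicolon-delimited formats like ODBC need plain string splitting instead.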
🐘 PostgreSQL
Most popular open-source relational DB
SQL
Standard URL
# Standard URL format (PostgreSQL + libpq)
postgresql://username:password@localhost:5432/mydb

# With options
postgresql://user:pass@host:5432/db?sslmode=require&connect_timeout=10

# postgres:// also works (alias)
postgres://user:pass@host/db
Key-Value (libpq)
# libpq key=value format
host=localhost port=5432 dbname=mydb
user=postgres password=secret
sslmode=require connect_timeout=10
Python (psycopg2 / SQLAlchemy)
import psycopg2
conn = psycopg2.connect(
    "postgresql://user:pass@localhost/db"
)

# SQLAlchemy engine
from sqlalchemy import create_engine
engine = create_engine(
    "postgresql+psycopg2://user:pass@host:5432/db"
)
With SSL (Production)
# sslmode options: disable, allow, prefer, require, verify-ca, verify-full
postgresql://user:pass@prod-server:5432/db
?sslmode=verify-full
&sslcert=/path/to/client.crt
&sslkey=/path/to/client.key
&sslrootcert=/path/to/ca.crt
Default port: 5432  |  Env var: DATABASE_URL (Heroku/Railway convention)
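One gotcha with the URL formats above: if the password itself contains reserved characters (@, /, :, #), it must be percent-encoded before being embedded, or the parser will split the string in the wrong place. A sketch with the standard library (the password is a made-up example):

```python
from urllib.parse import quote_plus

raw_password = "p@ss/w:rd"                # contains reserved characters
safe_password = quote_plus(raw_password)  # "p%40ss%2Fw%3Ard"

url = f"postgresql://myuser:{safe_password}@localhost:5432/mydb"
```

Drivers that accept URLs (libpq, SQLAlchemy) decode the percent-encoding back to the raw password when they parse the string.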
🐬 MySQL / MariaDB
World's most used open-source DB
SQL
Standard URL
# Standard
mysql://user:password@localhost:3306/mydb

# With SSL and charset
mysql://user:pass@host:3306/db
?ssl=true&charset=utf8mb4

# MariaDB (same format)
mariadb://user:pass@host:3306/db
ODBC Connection String
Driver={MySQL ODBC 8.0 Driver};
Server=localhost;
Port=3306;
Database=mydb;
User=root;
Password=secret;
Option=3;
Python (mysql-connector / SQLAlchemy)
# mysql-connector-python
import mysql.connector
conn = mysql.connector.connect(
    host="localhost", port=3306,
    user="root", password="secret",
    database="mydb"
)
# SQLAlchemy
"mysql+mysqlconnector://user:pass@host/db"
"mysql+pymysql://user:pass@host/db"
Default port: 3306  |  Note: Always set charset=utf8mb4 for emoji support
🪟 SQL Server (MSSQL)
Microsoft's enterprise relational database
SQL
ADO.NET (C# / .NET)
// Standard SQL auth
Server=myserver.database.windows.net;
Database=mydb;
User Id=myuser;
Password=mypassword;
Encrypt=True;

// Windows Integrated Auth (no password needed)
Server=localhost\SQLEXPRESS;
Database=mydb;
Trusted_Connection=True;
TrustServerCertificate=True;
ODBC Format
Driver={ODBC Driver 18 for SQL Server};
Server=tcp:myserver,1433;
Database=mydb;
Uid=myuser;
Pwd={my_password};
Encrypt=yes;
TrustServerCertificate=no;
Python (pyodbc) / Node (mssql)
# Python SQLAlchemy
"mssql+pyodbc://user:pass@server/db?driver=ODBC+Driver+18+for+SQL+Server"

// Node.js (mssql package)
const config = {
  server: 'myserver',
  database: 'mydb',
  user: 'user',
  password: 'pass',
  port: 1433,
  options: { encrypt: true, trustServerCertificate: false }
};
Default port: 1433  |  Named instance: Server=host\INSTANCE_NAME
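The semicolon-delimited ODBC/ADO.NET style is just key=value pairs, so it can be assembled from a dict rather than hand-concatenated; a minimal sketch (keys follow the ODBC example above, values are placeholders):

```python
def build_odbc_string(params: dict) -> str:
    """Join key=value pairs with semicolons, ODBC-style."""
    return ";".join(f"{k}={v}" for k, v in params.items()) + ";"

conn_str = build_odbc_string({
    "Driver": "{ODBC Driver 18 for SQL Server}",
    "Server": "tcp:myserver,1433",
    "Database": "mydb",
    "Uid": "myuser",
    "Pwd": "{my_password}",
    "Encrypt": "yes",
})
```

Since Python 3.7 dicts preserve insertion order, so the pairs come out in the order written above.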
📦 SQLite
Serverless file-based database
SQL
File Path (no host/user needed)
# Relative path
sqlite:///relative/path/to/db.sqlite

# Absolute path (note 4 slashes on Unix)
sqlite:////home/user/mydb.sqlite

# In-memory (lost when connection closes)
sqlite:///:memory:

# Python built-in
import sqlite3
conn = sqlite3.connect("mydb.sqlite")
conn = sqlite3.connect(":memory:")
No server needed - SQLite is a file. Perfect for dev, testing, and embedded apps. LangChain memory often uses SQLite.
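Because an in-memory database disappears the moment its connection closes, all work has to happen over that one connection; a quick round trip:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO users (name) VALUES (?)", ("alice",))

row = conn.execute("SELECT name FROM users").fetchone()  # ("alice",)

conn.close()  # the whole database is gone after this
```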
🔴 Oracle DB
Enterprise-grade Oracle database
SQL
JDBC / cx_Oracle / SQLAlchemy
// JDBC URL
jdbc:oracle:thin:@hostname:1521:ORCL
// With service name (preferred in modern Oracle)
jdbc:oracle:thin:@//hostname:1521/service_name

# Python (cx_Oracle / python-oracledb)
DSN = "hostname:1521/service_name"
conn = cx_Oracle.connect(
    user="scott", password="tiger", dsn=DSN
)
# SQLAlchemy
"oracle+cx_oracle://user:pass@host:1521/?service_name=orcl"
Default port: 1521  |  SID vs Service: Use service name (newer). SID is legacy.
☁️ Cloud SQL Platforms
Supabase, PlanetScale, Neon, Render Postgres
Cloud SQL
Supabase (PostgreSQL)
# Supabase direct connection
postgresql://postgres:[password]@
db.xxxx.supabase.co:5432/postgres

# Supabase pooled (transaction mode via PgBouncer)
postgresql://postgres:[password]@
aws-0-us-east-1.pooler.supabase.com:6543/postgres

# Neon (serverless Postgres)
postgresql://user:pass@ep-xxx.us-east-2.aws.neon.tech/dbname
?sslmode=require

# PlanetScale (MySQL-compatible)
mysql://user:pass@xxx.connect.psdb.cloud/mydb
?ssl={"rejectUnauthorized":true}
πŸƒ

NoSQL / Document Databases

MongoDB, Firestore, DynamoDB and other schema-flexible stores

πŸƒ MongoDB
Document database with rich query language
NoSQL
MongoDB URI Format
# Basic
mongodb://localhost:27017/mydb

# With auth
mongodb://user:password@localhost:27017/mydb
?authSource=admin

# Replica set
mongodb://user:pass@host1:27017,host2:27017/db
?replicaSet=myReplSet&readPreference=secondaryPreferred
MongoDB Atlas (SRV)
# Atlas uses +srv for DNS-based discovery (no port needed)
mongodb+srv://user:password@
cluster0.xxxxx.mongodb.net/mydb
?retryWrites=true&w=majority
&appName=MyApp
Python (pymongo) / Node (mongoose)
# Python
from pymongo import MongoClient
client = MongoClient("mongodb+srv://user:pass@cluster/db")
db = client["mydb"]

// Mongoose (Node.js)
// useNewUrlParser / useUnifiedTopology are no-ops since Mongoose 6
await mongoose.connect(
  "mongodb+srv://user:pass@cluster/db"
);
Default port: 27017  |  Atlas tip: Always use +srv format for Atlas - it handles failover automatically
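One quirk of the replica-set form: several host:port pairs share a single authority section, which generic URL parsers won't split for you. A small sketch of extracting them with the standard library (hostnames are made up):

```python
from urllib.parse import urlsplit

uri = "mongodb://user:pass@host1:27017,host2:27018/db?replicaSet=myReplSet"

netloc = urlsplit(uri).netloc              # "user:pass@host1:27017,host2:27018"
_, _, host_part = netloc.rpartition("@")   # drop credentials; empty prefix if none
hosts = [tuple(h.split(":")) for h in host_part.split(",")]
# hosts == [("host1", "27017"), ("host2", "27018")]
```

In practice pymongo does this parsing for you; the sketch only shows where the pieces live in the URI.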
🌌 Azure Cosmos DB
Multi-model globally distributed database
NoSQL
Cosmos DB Connection Patterns
# NoSQL API (native SDK)
from azure.cosmos import CosmosClient
client = CosmosClient(
    url="https://myaccount.documents.azure.com:443/",
    credential="PRIMARY_KEY_HERE"
)

# MongoDB API (use mongo URI with SSL)
mongodb://account:PRIMARY_KEY@
account.mongo.cosmos.azure.com:10255/mydb
?ssl=true&retrywrites=false

# Connection string from portal
AccountEndpoint=https://account.documents.azure.com:443/;
AccountKey=BASE64KEY==;
⚠️ Security: Use Managed Identity or Azure Key Vault in production - never hardcode AccountKey
🔥 Firebase / Firestore
Google's real-time NoSQL cloud database
NoSQL
Firebase SDK Config (no URI - uses JSON credentials)
// Client SDK (browser/React Native)
import { initializeApp } from "firebase/app";
const firebaseConfig = {
  apiKey: "AIzaSy...",
  authDomain: "myapp.firebaseapp.com",
  projectId: "myapp-12345",
  storageBucket: "myapp.appspot.com",
  messagingSenderId: "123456789",
  appId: "1:123:web:abc..."
};
const app = initializeApp(firebaseConfig);

# Python Admin SDK (server-side)
import firebase_admin
cred = firebase_admin.credentials.Certificate(
    "serviceAccountKey.json"
)
firebase_admin.initialize_app(cred)
⚡ Amazon DynamoDB
AWS serverless key-value & document store
NoSQL
DynamoDB - uses AWS credentials (no URI)
# Python (boto3)
import boto3
dynamodb = boto3.resource(
    "dynamodb",
    region_name="us-east-1",
    aws_access_key_id="AKIAIOSFODNN7",
    aws_secret_access_key="wJalrXUtnFEMI"
)
# Or use environment variables (preferred)
# AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, AWS_DEFAULT_REGION

# Local DynamoDB for dev
dynamodb = boto3.resource(
    "dynamodb",
    endpoint_url="http://localhost:8000"
)
🧠

Vector & Graph Databases

AI-native stores - semantic search, embeddings, and knowledge graphs

🎨 ChromaDB
Open-source vector DB for AI apps
Vector
ChromaDB Connection Modes
import chromadb

# In-memory (ephemeral - for testing)
client = chromadb.EphemeralClient()

# Persistent local (file-backed)
client = chromadb.PersistentClient(
    path="/path/to/chroma_data"
)

# Remote HTTP client (ChromaDB server running)
client = chromadb.HttpClient(
    host="localhost",
    port=8000
)

# Cloud (Chroma Cloud)
client = chromadb.CloudClient(
    tenant="my-tenant",
    database="my-db",
    api_key="chroma-api-key"
)
🌲 Pinecone
Managed vector database for ML
Vector
Pinecone (API Key Based)
from pinecone import Pinecone

# Initialize with API key
pc = Pinecone(api_key="your-api-key")

# Connect to index
index = pc.Index("my-index")

# Or specify host directly
index = pc.Index(
    name="my-index",
    host="https://my-index-xxx.svc.pinecone.io"
)
πŸ•ΈοΈ Weaviate
Open-source AI-native vector search engine
Vector
Weaviate Client
import weaviate
from weaviate.auth import AuthApiKey

# Local instance
client = weaviate.connect_to_local(
    host="localhost",
    port=8080, grpc_port=50051
)

# Weaviate Cloud Services (WCS)
client = weaviate.connect_to_wcs(
    cluster_url="https://cluster.weaviate.network",
    auth_credentials=AuthApiKey("wcs-api-key")
)
πŸ•·οΈ Neo4j
Graph database β€” nodes, edges, relationships
Graph
Neo4j Bolt / HTTP Connection
from neo4j import GraphDatabase

# Bolt protocol (default, binary, fast)
driver = GraphDatabase.driver(
    "bolt://localhost:7687",
    auth=("neo4j", "password")
)

# Neo4j Aura (cloud, TLS)
driver = GraphDatabase.driver(
    "neo4j+s://xxxxx.databases.neo4j.io",
    auth=("neo4j", "password")
)
# bolt:// / neo4j:// - no TLS
# bolt+s:// / neo4j+s:// - with TLS
# bolt+ssc:// / neo4j+ssc:// - TLS, self-signed certs accepted
Ports: Bolt: 7687  |  HTTP: 7474  |  HTTPS: 7473
☁️

Azure Cloud Services

Storage, Service Bus, Event Hubs, AI Services and more

πŸ—ƒοΈ Azure Storage
Blob, Queue, Table, File storage
Azure
Storage Account Connection String
# Full connection string (from Azure Portal)
DefaultEndpointsProtocol=https;
AccountName=mystorageaccount;
AccountKey=BASE64_KEY==;
EndpointSuffix=core.windows.net

# Blob-specific URL (with SAS token)
https://account.blob.core.windows.net/container
?sv=2021-06-08&ss=b&srt=sco&sp=rwdlacuptfx
&se=2024-12-31T00%3A00%3A00Z&sig=SIGNATURE

# Python SDK (preferred - Managed Identity)
from azure.storage.blob import BlobServiceClient
from azure.identity import DefaultAzureCredential
client = BlobServiceClient(
    account_url="https://account.blob.core.windows.net",
    credential=DefaultAzureCredential()
)
🚌 Azure Service Bus
Enterprise messaging - queues and topics
Azure
Service Bus Connection String
# Shared Access Signature (SAS)
Endpoint=sb://mynamespace.servicebus.windows.net/;
SharedAccessKeyName=RootManageSharedAccessKey;
SharedAccessKey=BASE64_KEY=

# Python SDK
from azure.servicebus import ServiceBusClient
client = ServiceBusClient.from_connection_string(
    conn_str="Endpoint=sb://..."
)
sender = client.get_queue_sender(
    queue_name="my-queue"
)
📡 Azure Event Hubs
Real-time data streaming ingestion service
Azure
Event Hubs Connection String
# Namespace-level (can send to any hub)
Endpoint=sb://mynamespace.servicebus.windows.net/;
SharedAccessKeyName=RootManageSharedAccessKey;
SharedAccessKey=BASE64_KEY=

# Entity-level (specific event hub)
Endpoint=sb://mynamespace.servicebus.windows.net/;
SharedAccessKeyName=send;
SharedAccessKey=KEY=;
EntityPath=my-event-hub
Note: Event Hubs and Service Bus share the same sb:// namespace format but use different SDKs
🔷 Azure SQL Database
Managed SQL Server in the cloud
Azure
Azure SQL Connection String
# ADO.NET
Server=tcp:myserver.database.windows.net,1433;
Initial Catalog=mydb;
Persist Security Info=False;
User ID=myuser;
Password=mypassword;
MultipleActiveResultSets=False;
Encrypt=True;
TrustServerCertificate=False;
Connection Timeout=30;
⚠️ Best practice: Use Azure Managed Identity instead of passwords - eliminates credential management entirely
🤖 Azure OpenAI / AI Services
Azure-hosted LLMs, Cognitive Services, AI Search
Azure AI
Azure OpenAI
from openai import AzureOpenAI
client = AzureOpenAI(
    api_key="AZURE_OPENAI_KEY",
    api_version="2024-02-01",
    azure_endpoint="https://myresource.openai.azure.com/"
)
# ENV VARS: AZURE_OPENAI_API_KEY
# AZURE_OPENAI_ENDPOINT, OPENAI_API_VERSION
Azure AI Search
from azure.search.documents import SearchClient
from azure.core.credentials import AzureKeyCredential
client = SearchClient(
    endpoint="https://myservice.search.windows.net",
    index_name="my-index",
    credential=AzureKeyCredential("ADMIN_KEY")
)
🟠

AWS Services

S3, RDS, SQS and AWS connection patterns

🪣 AWS S3
Simple Storage Service - object storage
AWS
S3 Paths & boto3
# S3 URI format (used in many tools)
s3://my-bucket/path/to/object.csv

# boto3 (uses AWS credentials from env/config)
import boto3
s3 = boto3.client(
    's3',
    region_name='us-east-1',
    aws_access_key_id='AKIAIOSFODNN7',
    aws_secret_access_key='wJalrXUtnFEMI'
)
# Best practice: use ~/.aws/credentials or IAM role
📨 AWS SQS / SNS
Simple Queue Service / Notification Service
AWS
SQS Queue URL (not a classic connection string)
# SQS Queue URL format
https://sqs.us-east-1.amazonaws.com/
123456789012/MyQueueName

# boto3 usage
sqs = boto3.client('sqs', region_name='us-east-1')
sqs.send_message(
    QueueUrl="https://sqs.us-east-1.amazonaws.com/...",
    MessageBody="Hello"
)
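Since the queue URL is fully determined by region, account ID, and queue name, it can be assembled without any lookup; a sketch (account number and queue name are the placeholders from above):

```python
def sqs_queue_url(region: str, account_id: str, queue_name: str) -> str:
    """Assemble a standard SQS queue URL from its three components."""
    return f"https://sqs.{region}.amazonaws.com/{account_id}/{queue_name}"

url = sqs_queue_url("us-east-1", "123456789012", "MyQueueName")
# "https://sqs.us-east-1.amazonaws.com/123456789012/MyQueueName"
```

At runtime, boto3's get_queue_url call is the authoritative way to resolve the URL.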
πŸ—„οΈ AWS RDS
Managed relational DB (Postgres, MySQL, etc.)
AWS
RDS Endpoint (same as standard DB URLs)
# RDS uses standard DB connection strings - just with an RDS hostname

# PostgreSQL on RDS
postgresql://admin:pass@
mydb.cluster-xxx.us-east-1.rds.amazonaws.com
:5432/mydb

# MySQL on RDS
mysql://admin:pass@
mydb.xxx.us-east-1.rds.amazonaws.com
:3306/mydb
IAM Auth: AWS RDS supports token-based IAM authentication - no password needed with a proper IAM role
📨

Message Brokers

RabbitMQ, Kafka, NATS - async communication between services

🐰 RabbitMQ
AMQP message broker
AMQP
AMQP URI Format
# AMQP (plain)
amqp://user:password@localhost:5672/vhost

# AMQPS (TLS)
amqps://user:password@broker.host.com:5671/vhost

# Python (pika)
import pika
params = pika.URLParameters(
    "amqp://guest:guest@localhost:5672/%2F"
)
conn = pika.BlockingConnection(params)
Default ports: AMQP: 5672  |  AMQPS: 5671  |  Management UI: 15672
🪵 Apache Kafka
High-throughput distributed event streaming
Kafka
Kafka Bootstrap Servers (not a URI - comma-separated list)
# Kafka uses bootstrap servers list (no URI scheme)
bootstrap.servers=broker1:9092,broker2:9092

# Python (confluent-kafka)
from confluent_kafka import Producer
p = Producer({
    "bootstrap.servers": "localhost:9092"
})

# With SASL/SSL (cloud Kafka)
{
  "bootstrap.servers": "broker.cloud:9092",
  "security.protocol": "SASL_SSL",
  "sasl.mechanism": "PLAIN",
  "sasl.username": "API_KEY",
  "sasl.password": "API_SECRET"
}
🔗 NATS
Lightweight cloud-native messaging system
NATS
NATS URL
# Basic NATS
nats://localhost:4222

# With credentials
nats://user:token@nats.example.com:4222

# TLS
tls://nats.example.com:4222

# Python
import nats
nc = await nats.connect("nats://localhost:4222")
⚡

Cache & Key-Value Stores

Redis, Memcached, and in-memory data structures

🔴 Redis
In-memory data structure store
Cache
Redis URI
# Basic (no auth)
redis://localhost:6379

# With password
redis://:mypassword@localhost:6379

# With username (Redis 6+)
redis://user:password@localhost:6379

# With DB index (0-15)
redis://localhost:6379/0

# TLS (rediss://)
rediss://user:pass@redis.cloud.com:6380
Python (redis-py)
import redis

# From URL
r = redis.Redis.from_url("redis://localhost:6379/0")

# Parameters
r = redis.Redis(
    host="localhost", port=6379, db=0,
    password="secret", decode_responses=True
)

# Async (redis.asyncio, formerly aioredis)
import redis.asyncio as aioredis
r = await aioredis.from_url("redis://localhost")
Redis Cluster & Sentinel
# Redis Cluster (redis-py)
from redis.cluster import RedisCluster, ClusterNode
rc = RedisCluster(
    startup_nodes=[
        ClusterNode("node1", 6379),
        ClusterNode("node2", 6379)
    ],
    decode_responses=True
)
# Upstash (serverless Redis, TLS)
rediss://default:TOKEN@xxx.upstash.io:6379
Default port: 6379  |  TLS: rediss:// (double-s)  |  Azure Cache for Redis uses port 6380 with SSL
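The redis:// variants above differ only in scheme, credentials, and db index, so a small helper can generate all of them; a sketch (not part of redis-py, just string assembly):

```python
def redis_url(host: str, port: int = 6379, db: int = 0,
              user: str = "", password: str = "", tls: bool = False) -> str:
    """Build a redis:// or rediss:// URL from individual parts."""
    scheme = "rediss" if tls else "redis"          # double-s means TLS
    auth = f"{user}:{password}@" if password else ""
    return f"{scheme}://{auth}{host}:{port}/{db}"

local_url = redis_url("localhost")
# "redis://localhost:6379/0"
cloud_url = redis_url("redis.cloud.com", 6380, password="s3cret", tls=True)
# "rediss://:s3cret@redis.cloud.com:6380/0"
```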
🧠 Memcached
Simple, fast distributed memory caching
Cache
Memcached Connection
# Memcached has no URI standard - uses host:port tuples

# Python (pymemcache)
from pymemcache.client.base import Client
client = Client(("localhost", 11211))

# Multiple servers (pooling)
from pymemcache.client.hash import HashClient
client = HashClient([
    ("server1", 11211),
    ("server2", 11211)
])
Default port: 11211  |  No auth in basic Memcached
🌐

Network Protocols & Transfer

FTP, SSH/SFTP, SMTP, LDAP - lower-level connection strings

📂 FTP / SFTP
File transfer protocols
Protocol
FTP / SFTP URIs
# FTP
ftp://user:password@ftp.example.com:21/path/to/dir

# FTPS (FTP over TLS)
ftps://user:pass@ftp.example.com:990

# SFTP (SSH File Transfer - different from FTPS)
sftp://user:pass@host.com:22/remote/path

# Python (paramiko SFTP)
import paramiko
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())  # dev only; verify host keys in prod
ssh.connect("host", port=22, username="user", password="pass")
sftp = ssh.open_sftp()
📧 SMTP / IMAP / POP3
Email protocols
Email
Email Protocol Connections
# SMTP URIs
smtp://smtp.gmail.com:587        # STARTTLS
smtps://smtp.gmail.com:465       # SSL/TLS

# Python smtplib
import smtplib
smtp = smtplib.SMTP("smtp.gmail.com", 587)
smtp.starttls()
smtp.login("user@gmail.com", "app_password")

# IMAP (read email)
import imaplib
imap = imaplib.IMAP4_SSL("imap.gmail.com", 993)
imap.login("user@gmail.com", "app_password")
πŸ” LDAP / Active Directory
Directory services and authentication
Auth
LDAP URI Format
# LDAP URI
ldap://ldap.example.com:389

# LDAPS (TLS)
ldaps://ldap.example.com:636

# Python (ldap3)
from ldap3 import Server, Connection
server = Server("ldaps://ad.company.com:636")
conn = Connection(
    server,
    user="CN=svc,OU=Users,DC=company,DC=com",
    password="secret", auto_bind=True
)
LDAP ports: 389 (plain) / 636 (TLS)  |  DN format: CN=name,OU=unit,DC=domain,DC=com
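The DN in the footer is itself a tiny format: comma-separated attribute=value pairs read from most specific (CN) to the domain (DC). A minimal sketch of splitting one; note it ignores RFC 4514 escaping, which real DNs can contain:

```python
def parse_dn(dn: str) -> list:
    """Split a simple DN into (attribute, value) pairs.
    Does NOT handle escaped commas (RFC 4514)."""
    pairs = []
    for part in dn.split(","):
        attr, _, value = part.partition("=")
        pairs.append((attr.strip(), value.strip()))
    return pairs

parsed = parse_dn("CN=svc,OU=Users,DC=company,DC=com")
# [("CN", "svc"), ("OU", "Users"), ("DC", "company"), ("DC", "com")]
```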
🌐 WebSocket
Full-duplex real-time communication
WebSocket
WebSocket URI
# ws:// (plain), wss:// (TLS - always use in production)
ws://localhost:8765
wss://api.example.com/ws
wss://api.example.com/ws?token=JWT_TOKEN

# Python (websockets)
import websockets
async with websockets.connect(
    "wss://echo.websocket.org"
) as ws:
    await ws.send("Hello!")
    msg = await ws.recv()

// JavaScript
const ws = new WebSocket("wss://api.example.com/ws");
🔑

API & Auth Connection Patterns

REST APIs, OAuth, JWT - connecting to services via credentials

🔑 API Keys (REST)
Most common API authentication pattern
Auth
API Key Patterns
# 1. Header (most secure - recommended)
Authorization: Bearer YOUR_API_KEY
X-API-Key: YOUR_API_KEY

# 2. Query param (avoid in production - keys end up in URL logs)
https://api.service.com/data?api_key=KEY

# Python (requests)
import requests
headers = {"Authorization": f"Bearer {API_KEY}"}
resp = requests.get("https://api.example.com/", headers=headers)

# OpenAI SDK pattern
from openai import OpenAI
client = OpenAI(api_key="sk-...")
πŸ›‘οΈ OAuth 2.0 / JWT
Token-based delegated authorization
OAuth
OAuth Token Exchange
# OAuth2 token endpoint (Client Credentials flow)
POST https://auth.example.com/oauth/token
Content-Type: application/x-www-form-urlencoded

grant_type=client_credentials
&client_id=my-client-id
&client_secret=my-client-secret
&scope=read:data write:data

# Use returned token
Authorization: Bearer eyJhbGciOiJSUzI1NiJ9...
⚠️ Security: Never store tokens in localStorage. Use httpOnly cookies or secure server-side sessions. Rotate secrets regularly.
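The bearer token above is a JWT: three base64url segments (header.payload.signature). The payload can be inspected with the standard library alone, but this performs no signature verification, so never make trust decisions from it:

```python
import base64
import json

def jwt_payload(token: str) -> dict:
    """Decode a JWT's payload segment WITHOUT verifying the signature."""
    payload_b64 = token.split(".")[1]
    payload_b64 += "=" * (-len(payload_b64) % 4)  # restore stripped padding
    return json.loads(base64.urlsafe_b64decode(payload_b64))
```

When the signature actually matters, use a real library such as PyJWT instead.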
🔗 Database Connection Pools
Managing multiple connections efficiently
Best Practice
SQLAlchemy Connection Pool (Python)
from sqlalchemy import create_engine

engine = create_engine(
    "postgresql://user:pass@localhost/db",
    pool_size=10,         # persistent connections
    max_overflow=20,      # extra connections under load
    pool_timeout=30,      # wait time before error
    pool_recycle=1800,    # recycle connections (secs)
    pool_pre_ping=True    # test connection before use
)
Rule: Always use connection pools in production. Never open a new DB connection per request - it will kill performance.
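The idea behind pooling is small enough to sketch with the standard library; a toy pool over SQLite connections, for illustration only (the hypothetical TinyPool mirrors pool_size and pool_timeout above; real apps should use SQLAlchemy's pool):

```python
import queue
import sqlite3

class TinyPool:
    """Toy connection pool: pre-open connections, hand them out, reclaim them."""
    def __init__(self, size: int, path: str = ":memory:"):
        self._pool = queue.Queue()
        for _ in range(size):                       # like pool_size
            self._pool.put(sqlite3.connect(path, check_same_thread=False))

    def acquire(self, timeout: float = 30):
        return self._pool.get(timeout=timeout)      # blocks, like pool_timeout

    def release(self, conn):
        self._pool.put(conn)                        # return to pool, don't close

pool = TinyPool(size=2)
conn = pool.acquire()
result = conn.execute("SELECT 1").fetchone()  # (1,)
pool.release(conn)
```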
📁

Files, Search & Misc

Elasticsearch clients and connection string security best practices

πŸ” Elasticsearch / OpenSearch
Distributed search & analytics engine
Search
Elasticsearch Client
from elasticsearch import Elasticsearch

# Basic local
es = Elasticsearch("http://localhost:9200")

# Cloud (Elastic Cloud)
es = Elasticsearch(
    cloud_id="my-cluster:xxx",
    api_key=("id", "api_key_value")
)

# With auth
es = Elasticsearch(
    "https://myhost:9200",
    basic_auth=("elastic", "password"),
    ca_certs="/path/to/cert.crt"
)
Default port: 9200 (HTTP) / 9300 (transport)  |  Azure AI Search is a managed alternative
🔒 Security Best Practices
Never hardcode credentials - use these patterns
Security
The Right Way to Handle Connection Strings
# ❌ NEVER - hardcoded in source code
DB_URL = "postgresql://admin:supersecret@prod/db"

# ✅ Environment variables
import os
DB_URL = os.getenv("DATABASE_URL")

# ✅ .env file (dev only - add to .gitignore!)
# Use python-dotenv
from dotenv import load_dotenv
load_dotenv()

# ✅ Azure Key Vault (production)
from azure.keyvault.secrets import SecretClient
from azure.identity import DefaultAzureCredential
client = SecretClient(
    vault_url="https://kv.vault.azure.net/",
    credential=DefaultAzureCredential()
)
DB_URL = client.get_secret("db-connection-string").value

# ✅ AWS Secrets Manager
import boto3
secrets = boto3.client("secretsmanager")
secret = secrets.get_secret_value(SecretId="my/db/conn")
📊 Quick Reference - Default Ports & Schemes
All the numbers you always forget
Service | Scheme | Default Port | TLS Port / Scheme | Auth Pattern
PostgreSQL | postgresql:// | 5432 | sslmode=require | user:password
MySQL | mysql:// | 3306 | ssl=true param | user:password
SQL Server | Server=host | 1433 | Encrypt=True | User Id / Trusted_Connection
Oracle | jdbc:oracle:thin:@ | 1521 | tcps:// scheme | user/password
MongoDB | mongodb:// / mongodb+srv:// | 27017 | mongodb+srv (Atlas) | user:password@host
Redis | redis:// | 6379 | rediss:// port 6380 | :password@ or user:pass@
RabbitMQ | amqp:// | 5672 | amqps:// port 5671 | user:password@host/vhost
Elasticsearch | http:// | 9200 | https:// + CA cert | basic_auth / api_key
Neo4j | bolt:// / neo4j:// | 7687 | neo4j+s:// | user:password
SMTP | smtp:// | 587 (STARTTLS) | smtps:// port 465 | login(user, pass)
LDAP | ldap:// | 389 | ldaps:// port 636 | DN string + password
FTP | ftp:// | 21 | ftps:// port 990 | user:password@host
SFTP/SSH | sftp:// | 22 | Always encrypted | user:pass or key file
WebSocket | ws:// | 80 (default) | wss:// port 443 | token in URL/header